Stochastic analysis of the diffusion least mean square and normalized least mean square algorithms for cyclostationary white Gaussian and non-Gaussian inputs

Authors

Abstract

The diffusion least mean square (DLMS) and the diffusion normalized least mean square (DNLMS) algorithms are analyzed for a network having a fusion center. This structure reduces the dimensionality of the resulting stochastic models while preserving important properties. The analysis is done in a system identification framework with cyclostationary white nodal inputs. The unknown parameters vary according to a random walk model. The cyclostationarity is modeled by periodic time variations of the input powers. The analysis holds for all types of input distributions except those with infinite variance. The derived models consist of simple scalar recursions. These recursions facilitate understanding of the mean-square dependence upon 1) the weighting coefficients, 2) the input kurtosis and cyclostationarities, 3) the noise powers, and 4) the unknown parameter increments. Optimization of the node weighting coefficients is studied. The stability of the two algorithms with respect to the weighting coefficients is also investigated. Significant differences are found between the behaviors of DLMS and DNLMS for non-Gaussian inputs. Simulations provide strong support for the theory.
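Since the abstract describes the setup only in words, the following Python sketch shows one plausible simulation of it: DLMS and DNLMS adaptation at each node, a fusion center that combines the intermediate node estimates with weighting coefficients, white nodal inputs whose power varies periodically (the cyclostationarity), and a random-walk model for the unknown parameters. All numerical values (step sizes, combiner weights, power profile, noise and increment variances) are illustrative assumptions, not values from the paper.

```python
# Hedged simulation sketch of DLMS/DNLMS with a fusion center under
# cyclostationary white inputs and random-walk parameter variations.
import numpy as np

rng = np.random.default_rng(0)

N = 8              # filter length
K = 4              # number of nodes
T = 100            # period of the input-power variation
n_iter = 20000
mu = 0.01          # DLMS step size (illustrative)
mu_n = 0.1         # DNLMS step size (illustrative)
eps = 1e-6         # DNLMS regularization
a = np.ones(K) / K                 # fusion-center weighting coefficients (sum to 1)
sigma_v2 = 1e-3 * np.ones(K)       # nodal noise powers
sigma_q2 = 1e-6                    # random-walk increment variance

w_o = rng.standard_normal(N)       # unknown parameter vector
w_dlms = np.zeros(N)               # fusion-center estimate, DLMS
w_dnlms = np.zeros(N)              # fusion-center estimate, DNLMS
msd_dlms = np.zeros(n_iter)
msd_dnlms = np.zeros(n_iter)

for n in range(n_iter):
    # cyclostationary white input: power varies periodically in time
    sigma_x2 = 1.0 + 0.5 * np.sin(2 * np.pi * n / T)
    psi_dlms = np.zeros((K, N))
    psi_dnlms = np.zeros((K, N))
    for k in range(K):
        x = np.sqrt(sigma_x2) * rng.standard_normal(N)   # white Gaussian nodal input
        v = np.sqrt(sigma_v2[k]) * rng.standard_normal()
        d = w_o @ x + v
        # local adaptation step at node k
        e = d - w_dlms @ x
        psi_dlms[k] = w_dlms + mu * e * x
        e_n = d - w_dnlms @ x
        psi_dnlms[k] = w_dnlms + mu_n * e_n * x / (eps + x @ x)
    # fusion center combines the intermediate node estimates
    w_dlms = a @ psi_dlms
    w_dnlms = a @ psi_dnlms
    # random-walk variation of the unknown parameters
    w_o = w_o + np.sqrt(sigma_q2) * rng.standard_normal(N)
    msd_dlms[n] = np.sum((w_o - w_dlms) ** 2)
    msd_dnlms[n] = np.sum((w_o - w_dnlms) ** 2)

print("steady-state MSD, DLMS :", msd_dlms[-2000:].mean())
print("steady-state MSD, DNLMS:", msd_dnlms[-2000:].mean())
```

Replacing `rng.standard_normal` at the nodes with a heavier-tailed (non-Gaussian) generator is one way to probe the Gaussian/non-Gaussian differences the abstract mentions.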


Similar resources

Diffusion Least Mean Square: Simulations

In this technical report we analyse the performance of diffusion strategies applied to the Least-Mean-Square adaptive filter. We configure a network of cooperative agents running adaptive filters and discuss their behaviour when compared with a non-cooperative agent which represents the average of the network. The analysis provides conditions under which diversity in the filter parameters is be...

Full text
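The entry above describes comparing a network of cooperative adaptive filters with a non-cooperative agent. The sketch below illustrates that comparison using an adapt-then-combine (ATC) diffusion LMS scheme over an assumed ring topology; the topology, step size, and noise level are illustrative choices, not taken from the report.

```python
# Hedged sketch: ATC diffusion LMS over a ring of agents vs. a
# non-cooperative agent adapting on its own data only.
import numpy as np

rng = np.random.default_rng(1)
N, K, n_iter, mu = 8, 6, 5000, 0.02

# Uniform combination weights over each agent's neighborhood (ring topology);
# each column of A sums to 1.
A = np.zeros((K, K))
for k in range(K):
    neigh = [(k - 1) % K, k, (k + 1) % K]
    A[neigh, k] = 1.0 / len(neigh)

w_o = rng.standard_normal(N)     # common unknown parameter vector
W = np.zeros((K, N))             # cooperative agents' estimates
w_solo = np.zeros(N)             # non-cooperative agent

for n in range(n_iter):
    psi = np.zeros((K, N))
    for k in range(K):
        x = rng.standard_normal(N)
        d = w_o @ x + 0.05 * rng.standard_normal()
        psi[k] = W[k] + mu * (d - W[k] @ x) * x        # adapt
    W = A.T @ psi                                      # combine
    x = rng.standard_normal(N)
    d = w_o @ x + 0.05 * rng.standard_normal()
    w_solo = w_solo + mu * (d - w_solo @ x) * x        # non-cooperative LMS

print("network-average MSD :", np.mean(np.sum((W - w_o) ** 2, axis=1)))
print("non-cooperative MSD :", np.sum((w_solo - w_o) ** 2))
```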

Mean square convergence analysis for kernel least mean square algorithm

In this paper, we study the mean square convergence of the kernel least mean square (KLMS). The fundamental energy conservation relation has been established in feature space. Starting from the energy conservation relation, we carry out the mean square convergence analysis and obtain several important theoretical results, including an upper bound on step size that guarantees the mean square con...

Full text
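The entry above cites the energy conservation relation in feature space without stating it. For orientation, the standard energy conservation relation used in mean-square analyses of LMS-type filters, written here with the feature-space transformed input, is sketched below; this generic form is an assumption for illustration and may differ in notation from the relation actually established in the paper.

```latex
% \tilde{\Omega}(i): weight-error vector in the feature space,
% e_a(i), e_p(i): a priori and a posteriori errors,
% \varphi(\mathbf{u}(i)): feature-space transformed input.
\[
  \bigl\|\tilde{\Omega}(i)\bigr\|^{2}
  + \frac{e_a^{2}(i)}{\bigl\|\varphi(\mathbf{u}(i))\bigr\|^{2}}
  =
  \bigl\|\tilde{\Omega}(i-1)\bigr\|^{2}
  + \frac{e_p^{2}(i)}{\bigl\|\varphi(\mathbf{u}(i))\bigr\|^{2}}
\]
```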

Least Mean Square Algorithm

The Least Mean Square (LMS) algorithm, introduced by Widrow and Hoff in 1959 [12], is an adaptive algorithm which uses a gradient-based method of steepest descent [10]. The LMS algorithm uses estimates of the gradient vector from the available data. LMS incorporates an iterative procedure that makes successive corrections to the weight vector in the direction of the negative of the gradient vect...

Full text
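The entry above describes the LMS update in words; a minimal sketch of that update (gradient estimated from the current data pair, correction along its negative direction) follows. The signal model, filter length, and step size are illustrative assumptions.

```python
# Minimal LMS sketch: successive corrections of the weight vector in the
# direction of the negative of the estimated gradient.
import numpy as np

rng = np.random.default_rng(2)
N, n_iter, mu = 8, 5000, 0.01
w_o = rng.standard_normal(N)       # unknown system (illustrative)
w = np.zeros(N)                    # adaptive weight vector

for n in range(n_iter):
    x = rng.standard_normal(N)                      # input regressor
    d = w_o @ x + 0.01 * rng.standard_normal()      # desired signal plus noise
    e = d - w @ x                                   # estimation error
    w = w + mu * e * x                              # correction along -gradient estimate

print("final weight-error norm:", np.linalg.norm(w - w_o))
```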

Kernel Least Mean Square Algorithm

A simple yet powerful learning method, called KLMS, is presented by combining the famed kernel trick and the least-mean-square (LMS) algorithm. General properties of the KLMS algorithm are demonstrated regarding its well-posedness in very high dimensional spaces using Tikhonov regularization theory. An experiment is studied to support our conclusion that the KLMS algorithm can be readily u...

Full text
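The entry above combines the kernel trick with LMS; the sketch below illustrates the resulting idea: the LMS update is carried out implicitly in the kernel-induced feature space, so the learned function becomes a growing sum of kernels centred at past inputs, each weighted by the step size times the prediction error at that step. The Gaussian kernel width, step size, and toy target function are illustrative assumptions.

```python
# Hedged KLMS sketch: function represented as a kernel expansion that grows
# by one centre per sample, with coefficient mu * prediction error.
import numpy as np

rng = np.random.default_rng(3)
mu, kernel_width, n_iter = 0.2, 1.0, 500

def kernel(u, v):
    return np.exp(-np.sum((u - v) ** 2) / (2 * kernel_width ** 2))

centers, coeffs = [], []           # expansion of the learned function

def predict(u):
    return sum(a * kernel(c, u) for a, c in zip(coeffs, centers))

for n in range(n_iter):
    u = rng.uniform(-1, 1, size=2)
    d = np.sin(np.pi * u[0]) * u[1] + 0.01 * rng.standard_normal()  # toy nonlinear target
    e = d - predict(u)             # prediction error
    centers.append(u)              # new kernel centre at the current input
    coeffs.append(mu * e)          # coefficient = step size times error

print("final absolute error:", abs(e))
```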


Journal

Journal title: International Journal of Adaptive Control and Signal Processing

Year: 2021

ISSN: 0890-6327, 1099-1115

DOI: https://doi.org/10.1002/acs.3334